Author: Stephen Situ
This notebook presents an exploration of various aspects of time series modeling, including correlations and inverse correlations, windows and horizons, neural network architectures, and univariate and multivariate modeling. Specifically, we explore daily climate data (temperature, humidity, wind speed, and mean pressure) from Delhi.
One of the main challenges in time series modeling is that the temporal order of the data must be preserved. To accomplish this, a sliding window approach is typically used: a window of fixed length (such as the previous 7 days) is used to predict the next horizon (such as the following 1 day).
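As a minimal sketch of the windowing idea on a toy sequence (separate from the notebook's own windowing function defined later):
import numpy as np
# toy series 1..10: pair each 7-value window with the single value that follows it
series = np.arange(1, 11)
WINDOW, HORIZON = 7, 1
pairs = [(series[i:i+WINDOW], series[i+WINDOW:i+WINDOW+HORIZON])
         for i in range(len(series) - WINDOW - HORIZON + 1)]
print(pairs[0])  # (array([1, 2, 3, 4, 5, 6, 7]), array([8]))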
The naive model, which simply uses the value at the last time step as the prediction for the next time step, is the most basic time series model. Despite the availability of more complex architectures, we find that the naive model is difficult to beat.
We also explore three types of neural networks for time series modeling: feedforward dense neural networks, LSTM (long short-term memory) neural networks, and convolutional 1D neural networks. Dense networks are a simple, fast baseline that treats each window as a flat feature vector; LSTM networks are well-suited to modeling sequences with temporal dependencies that span long intervals; and convolutional 1D networks are effective at capturing local patterns in time series data, such as short-term trends.
Univariate time series models consider only a single feature, while multivariate models take multiple features into account. For multivariate modeling, the sliding windows must be created over multiple features.
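A multivariate analogue of the univariate windowing function defined later might look like the following sketch (the function and its column-name parameters are illustrative; the label still comes from a single target column):
import numpy as np
def create_multivariate_windows(df, feature_cols, target_col, window_size, horizon):
    # each window stacks several feature columns: shape (window_size, n_features)
    windows, labels = [], []
    for i in range(len(df) - window_size - horizon + 1):
        windows.append(df[feature_cols].iloc[i:i+window_size].to_numpy())
        labels.append(df[target_col].iloc[i+window_size:i+window_size+horizon].to_numpy())
    return np.array(windows), np.array(labels)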
To use these models for forecasting beyond the data, the last window in the time series is used to predict the next time step. That prediction is then appended to form a new last window, and the process repeats until the desired number of future time steps is reached.
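A sketch of that loop (the function and its names are illustrative, assuming a model trained on windows of window_size values):
import numpy as np
def make_future_forecast(values, model, into_future, window_size):
    # start from the last observed window and roll the model forward one step at a time
    future_forecast = []
    last_window = np.asarray(values)[-window_size:]
    for _ in range(into_future):
        pred = model.predict(last_window.reshape(1, window_size))
        future_forecast.append(float(np.squeeze(pred)))
        # slide the window: drop the oldest value, append the newest prediction
        last_window = np.append(last_window[1:], np.squeeze(pred))
    return future_forecast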
Original Data: https://www.kaggle.com/datasets/sumanthvrao/daily-climate-time-series-data
# libraries
import numpy as np
import pandas as pd
import tensorflow as tf
# read csv
df_1 = pd.read_csv("DailyDelhiClimateTrain.csv")
df_2 = pd.read_csv("DailyDelhiClimateTest.csv")
# see both dataframes
df_1, df_2
date | meantemp | humidity | wind_speed | meanpressure |
---|---|---|---|---|---|
0 | 2013-01-01 | 10.000000 | 84.500000 | 0.000000 | 1015.666667 |
1 | 2013-01-02 | 7.400000 | 92.000000 | 2.980000 | 1017.800000 |
2 | 2013-01-03 | 7.166667 | 87.000000 | 4.633333 | 1018.666667 |
3 | 2013-01-04 | 8.666667 | 71.333333 | 1.233333 | 1017.166667 |
4 | 2013-01-05 | 6.000000 | 86.833333 | 3.700000 | 1016.500000 |
... | ... | ... | ... | ... | ... |
1457 | 2016-12-28 | 17.217391 | 68.043478 | 3.547826 | 1015.565217 |
1458 | 2016-12-29 | 15.238095 | 87.857143 | 6.000000 | 1016.904762 |
1459 | 2016-12-30 | 14.095238 | 89.666667 | 6.266667 | 1017.904762 |
1460 | 2016-12-31 | 15.052632 | 87.000000 | 7.325000 | 1016.100000 |
1461 | 2017-01-01 | 10.000000 | 100.000000 | 0.000000 | 1016.000000 |
[1462 rows x 5 columns]
date | meantemp | humidity | wind_speed | meanpressure |
---|---|---|---|---|---|
0 | 2017-01-01 | 15.913043 | 85.869565 | 2.743478 | 59.000000 |
1 | 2017-01-02 | 18.500000 | 77.222222 | 2.894444 | 1018.277778 |
2 | 2017-01-03 | 17.111111 | 81.888889 | 4.016667 | 1018.333333 |
3 | 2017-01-04 | 18.700000 | 70.050000 | 4.545000 | 1015.700000 |
4 | 2017-01-05 | 18.388889 | 74.944444 | 3.300000 | 1014.333333 |
... | ... | ... | ... | ... | ... |
109 | 2017-04-20 | 34.500000 | 27.500000 | 5.562500 | 998.625000 |
110 | 2017-04-21 | 34.250000 | 39.375000 | 6.962500 | 999.875000 |
111 | 2017-04-22 | 32.900000 | 40.900000 | 8.890000 | 1001.600000 |
112 | 2017-04-23 | 32.875000 | 27.500000 | 9.962500 | 1002.125000 |
113 | 2017-04-24 | 32.000000 | 27.142857 | 12.157143 | 1004.142857 |
[114 rows x 5 columns]
# df_1 and df_2 overlap on 2017-01-01 with conflicting readings, so remove the last row of df_1
df_1 = df_1.drop(df_1.index[-1])
# new row: average the two conflicting 2017-01-01 readings
new_row = pd.Series({'date': '2017-01-01', 'meantemp': (10.000000 + 15.913043) / 2,
                     'humidity': (100.000000 + 85.869565) / 2, 'wind_speed': (0.000000 + 2.743478) / 2,
                     'meanpressure': (1016.000000 + 59.000000) / 2})
# set first row of df_2 to new_row
df_2.iloc[0] = new_row
# union and save
df_3 = pd.concat([df_1,df_2])
df_3.to_csv("clean_series.csv",index=False)
df_3
date | meantemp | humidity | wind_speed | meanpressure | |
---|---|---|---|---|---|
0 | 2013-01-01 | 10.000000 | 84.500000 | 0.000000 | 1015.666667 |
1 | 2013-01-02 | 7.400000 | 92.000000 | 2.980000 | 1017.800000 |
2 | 2013-01-03 | 7.166667 | 87.000000 | 4.633333 | 1018.666667 |
3 | 2013-01-04 | 8.666667 | 71.333333 | 1.233333 | 1017.166667 |
4 | 2013-01-05 | 6.000000 | 86.833333 | 3.700000 | 1016.500000 |
... | ... | ... | ... | ... | ... |
109 | 2017-04-20 | 34.500000 | 27.500000 | 5.562500 | 998.625000 |
110 | 2017-04-21 | 34.250000 | 39.375000 | 6.962500 | 999.875000 |
111 | 2017-04-22 | 32.900000 | 40.900000 | 8.890000 | 1001.600000 |
112 | 2017-04-23 | 32.875000 | 27.500000 | 9.962500 | 1002.125000 |
113 | 2017-04-24 | 32.000000 | 27.142857 | 12.157143 | 1004.142857 |
1575 rows × 5 columns
# convert date column to datetime
df_3['date'] = pd.to_datetime(df_3['date'])
# data types
df_3.dtypes
date            datetime64[ns]
meantemp               float64
humidity               float64
wind_speed             float64
meanpressure           float64
dtype: object
df_3
date | meantemp | humidity | wind_speed | meanpressure | |
---|---|---|---|---|---|
0 | 2013-01-01 | 10.000000 | 84.500000 | 0.000000 | 1015.666667 |
1 | 2013-01-02 | 7.400000 | 92.000000 | 2.980000 | 1017.800000 |
2 | 2013-01-03 | 7.166667 | 87.000000 | 4.633333 | 1018.666667 |
3 | 2013-01-04 | 8.666667 | 71.333333 | 1.233333 | 1017.166667 |
4 | 2013-01-05 | 6.000000 | 86.833333 | 3.700000 | 1016.500000 |
... | ... | ... | ... | ... | ... |
109 | 2017-04-20 | 34.500000 | 27.500000 | 5.562500 | 998.625000 |
110 | 2017-04-21 | 34.250000 | 39.375000 | 6.962500 | 999.875000 |
111 | 2017-04-22 | 32.900000 | 40.900000 | 8.890000 | 1001.600000 |
112 | 2017-04-23 | 32.875000 | 27.500000 | 9.962500 | 1002.125000 |
113 | 2017-04-24 | 32.000000 | 27.142857 | 12.157143 | 1004.142857 |
1575 rows × 5 columns
# Mean temp and humidity have inverse correlation
import matplotlib.pyplot as plt
# create a line chart with date as x-axis
df_3.plot(x='date', y=['meantemp', 'humidity'], figsize=(10, 6))
# set chart title and axis labels
plt.title('Temperature and Humidity Over Time')
plt.xlabel('Date')
plt.ylabel('Value')
# display the chart
plt.show()
# Mean temp and wind speed have a positive correlation
import matplotlib.pyplot as plt
# create a line chart with date as x-axis
df_3.plot(x='date', y=['meantemp', 'wind_speed'], figsize=(10, 6))
# set chart title and axis labels
plt.title('Temperature and Wind Speed Over Time')
plt.xlabel('Date')
plt.ylabel('Value')
# display the chart
plt.show()
# Pressure stays mostly constant
import matplotlib.pyplot as plt
# create a line chart with date as x-axis
df_3.plot(x='date', y=['meantemp', 'meanpressure'], figsize=(10, 6))
# set chart title and axis labels
plt.title('Temperature and Mean Pressure Over Time')
plt.xlabel('Date')
plt.ylabel('Value')
# display the chart
plt.show()
# reset index (drop=True so the old index isn't kept as a column)
df_3 = df_3.reset_index(drop=True)
timesteps = df_3.index.to_numpy()
# Create train and test splits the right way for time series data
split_size = int(0.8 * len(df_3)) # 80% train, 20% test
# Create train data splits (everything before the split)
X_train = timesteps[:split_size]
y_train = df_3['meantemp'][:split_size]
# Create test data splits (everything after the split)
X_test = timesteps[split_size:]
y_test = df_3['meantemp'][split_size:]
# length
len(X_train), len(y_train), len(X_test), len(y_test)
(1260, 1260, 315, 315)
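Sanity check: int(0.8 × 1575) = 1260 training points, leaving 315 of the 1575 rows for testing.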
# plotting function
def plot_time_series(timesteps, values, format='.', start=0, end=None, label=None):
    """
    Plots timesteps (a series of points in time) against values (a series of values across timesteps).
    Parameters
    ---------
    timesteps : array of timesteps
    values : array of values across time
    format : style of plot, default "."
    start : index to start the plot from (defaults to the start of timesteps & values)
    end : index to end the plot at (defaults to the end of timesteps & values)
    label : label to show on plot of values
    """
    # Plot the series
    plt.plot(timesteps[start:end], values[start:end], format, label=label)
    plt.xlabel("Time")
    plt.ylabel("Temp")
    if label:
        plt.legend(fontsize=14)  # make label bigger
    plt.grid(True)
# Try out our plotting function
plt.figure(figsize=(10, 7))
plot_time_series(timesteps=X_train, values=y_train, label="Train data")
plot_time_series(timesteps=X_test, values=y_test, label="Test data")
# Create a naïve forecast, remove the last entry from test data
naive_forecast = y_test[:-1] # Naïve forecast equals every value excluding the last value
len(y_test),len(naive_forecast)
# Plot naive forecast
plt.figure(figsize=(10, 7))
plot_time_series(timesteps=X_train, values=y_train, label="Train data")
plot_time_series(timesteps=X_test, values=y_test, label="Test data")
plot_time_series(timesteps=X_test[1:], values=naive_forecast, format="-", label="Naive forecast");
# Zoomed in: the naive forecast is simply the test data shifted right by one timestep
plt.figure(figsize=(10, 7))
offset = 300 # start the plot 300 timesteps into the test set
plot_time_series(timesteps=X_test, values=y_test, start=offset,format="-",label="Test data")
plot_time_series(timesteps=X_test[1:], values=naive_forecast, format="-", start=offset, label="Naive forecast");
# MASE implemented courtesy of sktime - https://github.com/alan-turing-institute/sktime/blob/ee7a06843a44f4aaec7582d847e36073a9ab0566/sktime/performance_metrics/forecasting/_functions.py#L16
def mean_absolute_scaled_error(y_true, y_pred):
    """
    Implement MASE (assuming no seasonality of data).
    """
    mae = tf.reduce_mean(tf.abs(y_true - y_pred))
    # Find MAE of a naive forecast (no seasonality, so simply shift the series by 1 timestep)
    mae_naive_no_season = tf.reduce_mean(tf.abs(y_true[1:] - y_true[:-1]))
    return mae / mae_naive_no_season
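A quick toy check of the metric (illustrative numbers, not from the dataset): with y_true = [1, 2, 3, 4] and y_pred = [1, 2, 3, 5], the model MAE is 1/4 = 0.25 and the naive one-step MAE is 1.0, so MASE = 0.25.
mean_absolute_scaled_error(tf.constant([1., 2., 3., 4.]),
                           tf.constant([1., 2., 3., 5.]))  # -> 0.25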
# evaluate_preds function
def evaluate_preds(y_true, y_pred):
    # Make sure float32 (for metric calculations)
    y_true = tf.cast(y_true, dtype=tf.float32)
    y_pred = tf.cast(y_pred, dtype=tf.float32)
    # Calculate various metrics
    mae = tf.keras.metrics.mean_absolute_error(y_true, y_pred)
    mse = tf.keras.metrics.mean_squared_error(y_true, y_pred)  # puts an emphasis on outliers (all errors get squared)
    rmse = tf.sqrt(mse)
    mape = tf.keras.metrics.mean_absolute_percentage_error(y_true, y_pred)
    mase = mean_absolute_scaled_error(y_true, y_pred)
    return {"mae": mae.numpy(),
            "mse": mse.numpy(),
            "rmse": rmse.numpy(),
            "mape": mape.numpy(),
            "mase": mase.numpy()}
# naive results
naive_results = evaluate_preds(y_true=y_test[1:],
y_pred=naive_forecast)
naive_results
{'mae': 1.2355828, 'mse': 2.6691272, 'rmse': 1.6337464, 'mape': 5.375559, 'mase': 1.0022618}
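A MASE of roughly 1.0 here is expected by construction: MASE divides a model's MAE by the MAE of exactly this kind of one-step naive forecast, so a model must score below 1.0 on MASE to beat the baseline.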
# For a neural network problem, we need to create windows and horizons
# set window and horizon size
WINDOW_SIZE = 7
HORIZON = 1
# function to create windows and labels
def create_windows_and_labels(df, col_name, window_size, horizon):
    windows = []
    labels = []
    for i in range(len(df) - window_size - horizon + 1):
        window = df[col_name].iloc[i:i+window_size].values
        label = df[col_name].iloc[i+window_size:i+window_size+horizon].values
        windows.append(window)
        labels.append(label)
    return windows, labels
# call function
windows, labels = create_windows_and_labels(df_3,'meantemp',WINDOW_SIZE,HORIZON)
# check first 2 windows and labels
windows[0], labels[0], windows[1], labels[1]
(array([10. , 7.4 , 7.16666667, 8.66666667, 6. , 7. , 7. ]), array([8.85714286]), array([7.4 , 7.16666667, 8.66666667, 6. , 7. , 7. , 8.85714286]), array([14.]))
# check last 2 windows and labels
windows[-2],labels[-2],windows[-1],labels[-1]
(array([31. , 32.55555556, 34. , 33.5 , 34.5 , 34.25 , 32.9 ]), array([32.875]), array([32.55555556, 34. , 33.5 , 34.5 , 34.25 , 32.9 , 32.875 ]), array([32.]))
# Make the train/test splits
def make_train_test_splits(windows, labels, test_split=0.2):
    """
    Splits matching pairs of windows and labels into train and test splits.
    """
    split_size = int(len(windows) * (1 - test_split))  # defaults to 80% train/20% test
    train_windows = windows[:split_size]
    train_labels = labels[:split_size]
    test_windows = windows[split_size:]
    test_labels = labels[split_size:]
    return train_windows, test_windows, train_labels, test_labels
# split windows and labels
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels,test_split=0.2)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
(1254, 314, 1254, 314)
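Sanity check: 1575 rows yield 1575 − 7 − 1 + 1 = 1568 window/label pairs, and int(1568 × 0.8) = 1254 of them go to training, leaving 314 for testing.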
# Check train set
for i in range(2):
    print(train_windows[i])
    print(train_labels[i])
for i in range(-3, -1):
    print(train_windows[i])
    print(train_labels[i])
[10. 7.4 7.16666667 8.66666667 6. 7. 7. ]
[8.85714286]
[7.4 7.16666667 8.66666667 6. 7. 7. 8.85714286]
[14.]
[36.16666667 35.42857143 34.625 36.07142857 35.73333333 36.13333333 33.4375 ]
[35.5]
[35.42857143 34.625 36.07142857 35.73333333 36.13333333 33.4375 35.5 ]
[36.]
# Check test set
for i in range(2):
    print(test_windows[i])
    print(test_labels[i])
for i in range(-3, -1):
    print(test_windows[i])
    print(test_labels[i])
[36.07142857 35.73333333 36.13333333 33.4375 35.5 36. 32.625 ]
[34.73333333]
[35.73333333 36.13333333 33.4375 35.5 36. 32.625 34.73333333]
[33.5]
[31.22222222 31. 32.55555556 34. 33.5 34.5 34.25 ]
[32.9]
[31. 32.55555556 34. 33.5 34.5 34.25 32.9 ]
[32.875]
# convert to numpy array
train_windows = np.array(train_windows)
test_windows = np.array(test_windows)
train_labels = np.array(train_labels)
test_labels = np.array(test_labels)
# check shape
train_windows.shape,train_labels.shape,test_windows.shape,test_labels.shape
((1254, 7), (1254, 1), (314, 7), (314, 1))
# dense model
import tensorflow as tf
from tensorflow.keras import layers
tf.random.set_seed(42)
# Construct model
model_1 = tf.keras.Sequential([
layers.Dense(128, activation="relu"),
layers.Dense(HORIZON, activation="linear")
], name="model_1_dense")
# Compile model
model_1.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),
metrics=["mae"])
# Fit model
model_1.fit(x=train_windows,
y=train_labels,
epochs=100,
verbose=1,
batch_size=30,
validation_data=(test_windows, test_labels))
Epoch 1/100 42/42 [==============================] - 1s 4ms/step - loss: 3.7550 - mae: 3.7550 - val_loss: 1.8366 - val_mae: 1.8366
Epoch 2/100 42/42 [==============================] - 0s 2ms/step - loss: 1.5216 - mae: 1.5216 - val_loss: 1.4868 - val_mae: 1.4868
...
Epoch 99/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2496 - mae: 1.2496 - val_loss: 1.3311 - val_mae: 1.3311
Epoch 100/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2570 - mae: 1.2570 - val_loss: 1.2768 - val_mae: 1.2768
<keras.callbacks.History at 0x1920aa291f0>
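Before moving on, a minimal sketch (not a cell from the original run) of how a trained model can be scored with the same evaluate_preds function used for the naive baseline:
# predict on the test windows and squeeze the (314, 1) output to 1D, since HORIZON=1
model_1_preds = tf.squeeze(model_1.predict(test_windows))
model_1_results = evaluate_preds(y_true=tf.squeeze(test_labels), y_pred=model_1_preds)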
# Dense model with different window
HORIZON = 1
WINDOW_SIZE = 30
windows, labels = create_windows_and_labels(df_3,'meantemp',WINDOW_SIZE,HORIZON)
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
train_windows = np.array(train_windows)
test_windows = np.array(test_windows)
train_labels = np.array(train_labels)
test_labels = np.array(test_labels)
tf.random.set_seed(42)
model_2 = tf.keras.Sequential([
layers.Dense(128, activation="relu"),
layers.Dense(HORIZON)
], name="model_2_dense")
model_2.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),metrics=['mae'])
model_2.fit(train_windows,
train_labels,
epochs=100,
batch_size=30,
verbose=1,
validation_data=(test_windows, test_labels))
Epoch 1/100 42/42 [==============================] - 0s 4ms/step - loss: 4.8584 - mae: 4.8584 - val_loss: 2.1913 - val_mae: 2.1913
Epoch 2/100 42/42 [==============================] - 0s 2ms/step - loss: 1.9928 - mae: 1.9928 - val_loss: 1.9071 - val_mae: 1.9071
...
Epoch 99/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2427 - mae: 1.2427 - val_loss: 1.9976 - val_mae: 1.9976
Epoch 100/100 42/42 [==============================] - 0s 2ms/step - loss: 1.6560 - mae: 1.6560 - val_loss: 1.4096 - val_mae: 1.4096
<keras.callbacks.History at 0x1920bea1040>
# Dense model with different window and horizon
HORIZON = 7
WINDOW_SIZE = 30
windows, labels = create_windows_and_labels(df_3,'meantemp',WINDOW_SIZE,HORIZON)
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
train_windows = np.array(train_windows)
test_windows = np.array(test_windows)
train_labels = np.array(train_labels)
test_labels = np.array(test_labels)
tf.random.set_seed(42)
model_3 = tf.keras.Sequential([
layers.Dense(128, activation="relu"),
layers.Dense(HORIZON)
], name="model_3_dense")
model_3.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),metrics=['mae'])
model_3.fit(train_windows,
train_labels,
batch_size=30,
epochs=100,
verbose=1,
validation_data=(test_windows, test_labels))
Epoch 1/100 42/42 [==============================] - 1s 7ms/step - loss: 6.8220 - mae: 6.8220 - val_loss: 3.0622 - val_mae: 3.0622
Epoch 2/100 42/42 [==============================] - 0s 5ms/step - loss: 2.8805 - mae: 2.8805 - val_loss: 2.5576 - val_mae: 2.5576
...
Epoch 99/100 42/42 [==============================] - 0s 2ms/step - loss: 1.8425 - mae: 1.8425 - val_loss: 1.9473 - val_mae: 1.9473
Epoch 100/100 42/42 [==============================] - 0s 2ms/step - loss: 1.8697 - mae: 1.8697 - val_loss: 2.0381 - val_mae: 2.0381
<keras.callbacks.History at 0x1920d02a0a0>
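Note the jump in validation MAE, roughly 1.8 to 2.0 here versus about 1.2 for the one-step models: with HORIZON = 7 each prediction spans a week, and error grows with forecast distance.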
# Convolutional 1D Neural Network
HORIZON = 1
WINDOW_SIZE = 7
windows, labels = create_windows_and_labels(df_3,'meantemp',WINDOW_SIZE,HORIZON)
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
train_windows = np.array(train_windows)
test_windows = np.array(test_windows)
train_labels = np.array(train_labels)
test_labels = np.array(test_labels)
tf.random.set_seed(42)
# Create model (Conv1D expects 3D input: (batch, timesteps, features))
model_4 = tf.keras.Sequential([
    layers.Lambda(lambda x: tf.expand_dims(x, axis=1)),  # (batch, WINDOW_SIZE) -> (batch, 1, WINDOW_SIZE)
    layers.Conv1D(filters=128, kernel_size=5, padding="causal", activation="relu"),
    layers.Dense(HORIZON)
], name="model_4_conv1D")
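As a quick, illustrative shape check (not a cell from the original run): the Lambda layer turns each batch of windows from (batch, WINDOW_SIZE) into (batch, 1, WINDOW_SIZE), i.e. one timestep with WINDOW_SIZE channels, which satisfies Conv1D's 3D input requirement.
print(tf.expand_dims(tf.ones([1, WINDOW_SIZE]), axis=1).shape)  # (1, 1, 7)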
model_4.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),metrics=['mae'])
model_4.fit(train_windows,
train_labels,
batch_size=30,
epochs=100,
verbose=1,
validation_data=(test_windows, test_labels))
Epoch 1/100 42/42 [==============================] - 1s 5ms/step - loss: 5.4571 - mae: 5.4571 - val_loss: 1.6704 - val_mae: 1.6704
Epoch 2/100 42/42 [==============================] - 0s 2ms/step - loss: 1.6178 - mae: 1.6178 - val_loss: 1.5974 - val_mae: 1.5974
...
Epoch 79/100 42/42 [==============================] - 0s 3ms/step - loss: 1.2414 - mae: 1.2414 - val_loss: 1.2711 - val_mae: 1.2711
Epoch 80/100 42/42 [==============================] - 0s 3ms/step - loss: 1.2480 - mae: 1.2480 - val_loss: 1.2204 - val_mae: 1.2204
Epoch 81/100 42/42 [==============================] - 0s 4ms/step - loss: 1.2676 - mae: 1.2676 - val_loss:
1.2966 - val_mae: 1.2966 Epoch 82/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2524 - mae: 1.2524 - val_loss: 1.2245 - val_mae: 1.2245 Epoch 83/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2353 - mae: 1.2353 - val_loss: 1.2501 - val_mae: 1.2501 Epoch 84/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2397 - mae: 1.2397 - val_loss: 1.2163 - val_mae: 1.2163 Epoch 85/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2425 - mae: 1.2425 - val_loss: 1.2271 - val_mae: 1.2271 Epoch 86/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2484 - mae: 1.2484 - val_loss: 1.2197 - val_mae: 1.2197 Epoch 87/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2386 - mae: 1.2386 - val_loss: 1.2765 - val_mae: 1.2765 Epoch 88/100 42/42 [==============================] - 0s 3ms/step - loss: 1.2686 - mae: 1.2686 - val_loss: 1.2272 - val_mae: 1.2272 Epoch 89/100 42/42 [==============================] - 0s 3ms/step - loss: 1.2537 - mae: 1.2537 - val_loss: 1.2487 - val_mae: 1.2487 Epoch 90/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2594 - mae: 1.2594 - val_loss: 1.2184 - val_mae: 1.2184 Epoch 91/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2418 - mae: 1.2418 - val_loss: 1.2487 - val_mae: 1.2487 Epoch 92/100 42/42 [==============================] - 0s 2ms/step - loss: 1.3004 - mae: 1.3004 - val_loss: 1.2200 - val_mae: 1.2200 Epoch 93/100 42/42 [==============================] - 0s 3ms/step - loss: 1.2988 - mae: 1.2988 - val_loss: 1.2214 - val_mae: 1.2214 Epoch 94/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2398 - mae: 1.2398 - val_loss: 1.2234 - val_mae: 1.2234 Epoch 95/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2574 - mae: 1.2574 - val_loss: 1.2900 - val_mae: 1.2900 Epoch 96/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2755 - mae: 1.2755 - val_loss: 1.6035 - val_mae: 1.6035 Epoch 97/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2653 - mae: 1.2653 - val_loss: 1.3000 - val_mae: 1.3000 Epoch 98/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2506 - mae: 1.2506 - val_loss: 1.2193 - val_mae: 1.2193 Epoch 99/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2216 - mae: 1.2216 - val_loss: 1.2728 - val_mae: 1.2728 Epoch 100/100 42/42 [==============================] - 0s 2ms/step - loss: 1.2388 - mae: 1.2388 - val_loss: 1.2257 - val_mae: 1.2257
<keras.callbacks.History at 0x1920d03fa00>
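The validation MAE flattens out around 1.22 well before epoch 100. One option, sketched below and not used in these runs, is Keras's built-in EarlyStopping callback, which halts training once val_loss stops improving and can restore the best weights seen:
# optional: stop training once val_loss stops improving (not used in the original runs)
early_stopping = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=10, restore_best_weights=True)
# pass callbacks=[early_stopping] to model.fit(...) to enable it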
# LSTM Neural Network
HORIZON = 1
WINDOW_SIZE = 7
windows, labels = create_windows_and_labels(df_3,'meantemp',WINDOW_SIZE,HORIZON)
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
train_windows = np.array(train_windows)
test_windows = np.array(test_windows)
train_labels = np.array(train_labels)
test_labels = np.array(test_labels)
tf.random.set_seed(42)
inputs = layers.Input(shape=(WINDOW_SIZE,))
x = layers.Lambda(lambda x: tf.expand_dims(x, axis=1))(inputs)
# x = layers.LSTM(128, activation="relu", return_sequences=True)(x) # this layer will error if the inputs are not the right shape
x = layers.LSTM(128, activation="relu")(x) # using the tanh loss function results in a massive error
# x = layers.Dense(32, activation="relu")(x)
output = layers.Dense(HORIZON)(x)
model_5 = tf.keras.Model(inputs=inputs, outputs=output, name="model_5_lstm")
model_5.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),metrics=['mae'])
model_5.fit(train_windows,
train_labels,
epochs=100,
verbose=1,
batch_size=128,
validation_data=(test_windows, test_labels))
Epoch 1/100 10/10 [==============================] - 1s 26ms/step - loss: 20.6324 - mae: 20.6324 - val_loss: 16.2859 - val_mae: 16.2859
...
Epoch 100/100 10/10 [==============================] - 0s 7ms/step - loss: 1.2648 - mae: 1.2648 - val_loss: 1.2111 - val_mae: 1.2111
<keras.callbacks.History at 0x192006eae20>
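The Lambda layer above matters because LSTM layers expect 3-D inputs of shape (batch, timesteps, features), while each window is a 2-D (batch, WINDOW_SIZE) array; expanding on axis 1 treats every window as a length-1 sequence with WINDOW_SIZE features. A minimal shape check (the sample values here are just illustrative):
# sanity check: expand_dims adds the timesteps axis the LSTM needs
sample = tf.constant([[10.0, 7.4, 7.2, 8.7, 6.0, 7.0, 7.0]]) # one window, shape (1, 7)
expanded = tf.expand_dims(sample, axis=1) # shape (1, 1, 7) = (batch, timesteps, features)
print(sample.shape, expanded.shape)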
def make_preds(model, input_data):
    """
    Uses model to make predictions on input_data.
    Parameters
    ----------
    model: trained model
    input_data: windowed input data (same kind of data model was trained on)
    Returns model predictions on input_data.
    """
    forecast = model.predict(input_data)
    return tf.squeeze(forecast) # return 1D array of predictions
def evaluate_preds(y_true, y_pred):
    # Make sure float32 (for metric calculations)
    y_true = tf.cast(y_true, dtype=tf.float32)
    y_pred = tf.cast(y_pred, dtype=tf.float32)
    # Calculate various metrics
    mae = tf.keras.metrics.mean_absolute_error(y_true, y_pred)
    mse = tf.keras.metrics.mean_squared_error(y_true, y_pred)
    rmse = tf.sqrt(mse)
    mape = tf.keras.metrics.mean_absolute_percentage_error(y_true, y_pred)
    mase = mean_absolute_scaled_error(y_true, y_pred)
    # Account for different sized metrics (for longer horizons, reduce to single number)
    if mae.ndim > 0: # if mae isn't already a scalar, reduce it to one by aggregating tensors to mean
        mae = tf.reduce_mean(mae)
        mse = tf.reduce_mean(mse)
        rmse = tf.reduce_mean(rmse)
        mape = tf.reduce_mean(mape)
        mase = tf.reduce_mean(mase)
    return {"mae": mae.numpy(),
            "mse": mse.numpy(),
            "rmse": rmse.numpy(),
            "mape": mape.numpy(),
            "mase": mase.numpy()}
model_1_preds = make_preds(model_1, test_windows)
model_1_results = evaluate_preds(y_true=tf.squeeze(test_labels), # reduce to right shape
y_pred=model_1_preds)
model_1_results
10/10 [==============================] - 0s 889us/step
{'mae': 1.2768153, 'mse': 2.7734833, 'rmse': 1.6653779, 'mape': 5.5863523, 'mase': 1.0357081}
model_2_preds = make_preds(model_2, test_windows)
model_2_results = evaluate_preds(y_true=tf.squeeze(test_labels), # reduce to right shape
y_pred=model_2_preds)
model_2_results
10/10 [==============================] - 0s 1ms/step
{'mae': 1.4096184, 'mse': 3.087791, 'rmse': 1.7572111, 'mape': 5.848076, 'mase': 1.1505425}
model_3_preds = make_preds(model_3, test_windows)
model_3_results = evaluate_preds(y_true=tf.squeeze(test_labels), # reduce to right shape
y_pred=model_3_preds)
model_3_results
10/10 [==============================] - 0s 1000us/step
{'mae': 2.0381174, 'mse': 6.5826836, 'rmse': 2.3279927, 'mape': 8.4692, 'mase': 1.6481316}
model_4_preds = make_preds(model_4, test_windows)
model_4_results = evaluate_preds(y_true=tf.squeeze(test_labels), # reduce to right shape
y_pred=model_4_preds)
model_4_results
10/10 [==============================] - 0s 1000us/step
{'mae': 1.2256584, 'mse': 2.5369978, 'rmse': 1.5927956, 'mape': 5.294838, 'mase': 0.99421144}
model_5_preds = make_preds(model_5, test_windows)
model_5_results = evaluate_preds(y_true=tf.squeeze(test_labels), # reduce to right shape
y_pred=model_5_preds)
model_5_results
10/10 [==============================] - 0s 2ms/step
{'mae': 1.2110624, 'mse': 2.4748514, 'rmse': 1.573166, 'mape': 5.254684, 'mase': 0.9823717}
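At this point it helps to line the five univariate models up side by side. This comparison cell is not in the original notebook, but it simply collects the result dictionaries computed above into one table:
# gather the per-model metrics into a single dataframe for comparison
results_df = pd.DataFrame([model_1_results, model_2_results, model_3_results, model_4_results, model_5_results],
                          index=["model_1", "model_2", "model_3", "model_4", "model_5"])
results_df
Only model_4 and model_5 land below a MASE of 1, i.e. actually beat the naive baseline, and the LSTM (model_5) scores lowest on every metric.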
# model_1 plot
offset = 300
plt.figure(figsize=(10, 7))
# Account for the test_window offset and index into test_labels to ensure correct plotting
plot_time_series(timesteps=X_test[-len(test_windows):], values=test_labels[:, 0], start=offset, label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):], values=model_1_preds, start=offset, format="-", label="model_1_preds")
# model_2 plot
offset = 300
plt.figure(figsize=(10, 7))
# Account for the test_window offset
plot_time_series(timesteps=X_test[-len(test_windows):], values=test_labels[:, 0], start=offset, label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):], values=model_2_preds, start=offset, format="-", label="model_2_preds")
# model_3 plot
offset = 300
plt.figure(figsize=(10, 7))
# Plot model_3_preds by aggregating them (note: this condenses information so the preds will look further ahead than the test data)
plot_time_series(timesteps=X_test[-len(test_windows):],
                 values=test_labels[:, 0],
                 start=offset,
                 label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):],
values=tf.reduce_mean(model_3_preds, axis=1),
format="-",
start=offset,
label="model_3_preds")
# model_4 plot
offset = 300
plt.figure(figsize=(10, 7))
# Account for the test_window offset
plot_time_series(timesteps=X_test[-len(test_windows):], values=test_labels[:, 0], start=offset, label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):], values=model_4_preds, start=offset, format="-", label="model_4_preds")
# model_5 plot
offset = 300
plt.figure(figsize=(10, 7))
# Account for the test_window offset
plot_time_series(timesteps=X_test[-len(test_windows):], values=test_labels[:, 0], start=offset, label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):], values=model_5_preds, start=offset, format="-", label="model_5_preds")
# Create a function that can make window from 2 columns for Multivariate time series model
def create_windows_and_labels(df, col_1, col_2, window_size, horizon):
    windows = []
    labels = []
    for i in range(len(df) - window_size - horizon + 1):
        window = df[[col_1, col_2]].iloc[i:i+window_size].values
        label = df[col_1].iloc[i+window_size:i+window_size+horizon].values
        windows.append(window)
        labels.append(label)
    windows = np.array(windows)
    # flatten each (window_size, 2) window into one row of length window_size * 2
    windows = windows.reshape(windows.shape[0], -1)
    labels = np.array(labels)
    return windows, labels
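Because the dense network takes 2-D inputs, each (window_size, 2) window is flattened into a single row of length window_size * 2, so meantemp and humidity values end up interleaved. A quick shape check (a minimal sketch, assuming df_3 is loaded as above; the names w and l are just for this check):
# sanity-check the flattened multivariate window shapes
w, l = create_windows_and_labels(df_3, 'meantemp', 'humidity', 7, 1)
print(w.shape) # (1568, 14): 7 timesteps x 2 interleaved features per row
print(l.shape) # (1568, 1): the next day's meantemp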
# Create multivariate window
HORIZON = 1
WINDOW_SIZE = 7
windows, labels = create_windows_and_labels(df_3,'meantemp','humidity',WINDOW_SIZE,HORIZON)
train_windows, test_windows, train_labels, test_labels = make_train_test_splits(windows, labels)
len(train_windows), len(test_windows), len(train_labels), len(test_labels)
(1254, 314, 1254, 314)
# Check that the sliding windows are correctly made over the temperature and humidity columns
df_3.head(10)
 | level_0 | index | date | meantemp | humidity | wind_speed | meanpressure |
---|---|---|---|---|---|---|---|
0 | 0 | 0 | 2013-01-01 | 10.000000 | 84.500000 | 0.000000 | 1015.666667 |
1 | 1 | 1 | 2013-01-02 | 7.400000 | 92.000000 | 2.980000 | 1017.800000 |
2 | 2 | 2 | 2013-01-03 | 7.166667 | 87.000000 | 4.633333 | 1018.666667 |
3 | 3 | 3 | 2013-01-04 | 8.666667 | 71.333333 | 1.233333 | 1017.166667 |
4 | 4 | 4 | 2013-01-05 | 6.000000 | 86.833333 | 3.700000 | 1016.500000 |
5 | 5 | 5 | 2013-01-06 | 7.000000 | 82.800000 | 1.480000 | 1018.000000 |
6 | 6 | 6 | 2013-01-07 | 7.000000 | 78.600000 | 6.300000 | 1020.000000 |
7 | 7 | 7 | 2013-01-08 | 8.857143 | 63.714286 | 7.142857 | 1018.714286 |
8 | 8 | 8 | 2013-01-09 | 14.000000 | 51.250000 | 12.500000 | 1017.000000 |
9 | 9 | 9 | 2013-01-10 | 11.000000 | 62.000000 | 7.400000 | 1015.666667 |
# Check train set
for i in range(2):
    print(train_windows[i])
    print(train_labels[i])
for i in range(-3,-1):
    print(train_windows[i])
    print(train_labels[i])
[10. 84.5 7.4 92. 7.16666667 87. 8.66666667 71.33333333 6. 86.83333333 7. 82.8 7. 78.6 ] [8.85714286] [ 7.4 92. 7.16666667 87. 8.66666667 71.33333333 6. 86.83333333 7. 82.8 7. 78.6 8.85714286 63.71428571] [14.] [36.16666667 51.75 35.42857143 45.71428571 34.625 59.1875 36.07142857 44.64285714 35.73333333 43.73333333 36.13333333 41.86666667 33.4375 49.9375 ] [35.5] [35.42857143 45.71428571 34.625 59.1875 36.07142857 44.64285714 35.73333333 43.73333333 36.13333333 41.86666667 33.4375 49.9375 35.5 37.125 ] [36.]
# Check test set
for i in range(2):
    print(test_windows[i])
    print(test_labels[i])
for i in range(-3,-1):
    print(test_windows[i])
    print(test_labels[i])
[36.07142857 44.64285714 35.73333333 43.73333333 36.13333333 41.86666667 33.4375 49.9375 35.5 37.125 36. 43.3125 32.625 55.125 ] [34.73333333] [35.73333333 43.73333333 36.13333333 41.86666667 33.4375 49.9375 35.5 37.125 36. 43.3125 32.625 55.125 34.73333333 48.86666667] [33.5] [31.22222222 30.44444444 31. 34.25 32.55555556 38.44444444 34. 27.33333333 33.5 24.125 34.5 27.5 34.25 39.375 ] [32.9] [31. 34.25 32.55555556 38.44444444 34. 27.33333333 33.5 24.125 34.5 27.5 34.25 39.375 32.9 40.9 ] [32.875]
# Dense model with multivariate window
tf.random.set_seed(42)
model_6 = tf.keras.Sequential([
layers.Dense(128, activation="relu"),
# layers.Dense(128, activation="relu"),
layers.Dense(HORIZON)
], name="model_6_dense_multivariate")
model_6.compile(loss="mae",
optimizer=tf.keras.optimizers.Adam(),metrics=['mae'])
# Fit
model_6.fit(train_windows, train_labels,
epochs=100,
batch_size=128,
verbose=1, # show training progress each epoch
validation_data=(test_windows, test_labels))
Epoch 1/100 10/10 [==============================] - 0s 14ms/step - loss: 6.2210 - mae: 6.2210 - val_loss: 2.7103 - val_mae: 2.7103
...
Epoch 100/100 10/10 [==============================] - 0s 4ms/step - loss: 1.3028 - mae: 1.3028 - val_loss: 1.3972 - val_mae: 1.3972
<keras.callbacks.History at 0x192119dfe80>
model_6_preds = tf.squeeze(model_6.predict(test_windows))
model_6_results = evaluate_preds(y_true=test_labels,
y_pred=model_6_preds)
model_6_results
model_6_preds[:10]
10/10 [==============================] - 0s 889us/step
<tf.Tensor: shape=(10,), dtype=float32, numpy= array([33.647312, 35.02929 , 34.951817, 34.530388, 35.363094, 35.817043, 32.451668, 33.206333, 34.080868, 33.935658], dtype=float32)>
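Note that a cell only echoes its final expression, so model_6_results above is computed but never displayed; printing it explicitly shows the multivariate model's metrics:
# model_6_results was computed above but not echoed; print it to inspect the metrics
print(model_6_results)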
# model_6 plot
offset = 300
plt.figure(figsize=(10, 7))
# Account for the test_window offset
plot_time_series(timesteps=X_test[-len(test_windows):], values=test_labels[:, 0], start=offset, label="test_data")
plot_time_series(timesteps=X_test[-len(test_windows):], values=model_6_preds, start=offset, format="-", label="model_6_preds")
# An easier way to create the windows is the tf.keras.preprocessing.timeseries_dataset_from_array function
window_size = 7
horizon = 1
data = df_3['meantemp'].values
dataset = tf.keras.preprocessing.timeseries_dataset_from_array(
data=data,
targets=None,
sequence_length=window_size + horizon,
sequence_stride=1,
batch_size=1)
windows = []
labels = []
for window in dataset:
    windows.append(window[:, :-horizon])
    labels.append(window[:, -horizon:])
windows = np.array(windows)
labels = np.array(labels)
# Check windows and labels
for i in range(2):
    print(windows[i])
    print(labels[i])
for i in range(-3,-1):
    print(windows[i])
    print(labels[i])
[[10. 7.4 7.16666667 8.66666667 6. 7. 7. ]] [[8.85714286]] [[7.4 7.16666667 8.66666667 6. 7. 7. 8.85714286]] [[14.]] [[31.22222222 31. 32.55555556 34. 33.5 34.5 34.25 ]] [[32.9]] [[31. 32.55555556 34. 33.5 34.5 34.25 32.9 ]] [[32.875]]
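One caveat with this approach: because batch_size=1, the stacked arrays keep a singleton batch axis (windows has shape (N, 1, 7)). If these were fed to the models above, the extra axis would need squeezing first (a minimal sketch):
# drop the singleton batch axis left over from batch_size=1
windows = np.squeeze(windows, axis=1) # (N, 7)
labels = np.squeeze(labels, axis=1) # (N, 1)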
# For multivariate, change it to this
window_size = 7
horizon = 1
data = df_3[['meantemp', 'humidity']].values
dataset = tf.keras.preprocessing.timeseries_dataset_from_array(
data=data,
targets=None,
sequence_length=window_size + horizon,
sequence_stride=1,
batch_size=1
)
windows = []
labels = []
for window in dataset:
    windows.append(window[:, :-horizon, :2]) # include both columns in input windows
    labels.append(window[:, -horizon:, 0]) # only include first column (meantemp) in output labels
windows = np.array(windows)
labels = np.array(labels)
# Check windows and labels
for i in range(2):
    print(windows[i])
    print(labels[i])
for i in range(-3,-1):
    print(windows[i])
    print(labels[i])
[[[10. 84.5 ] [ 7.4 92. ] [ 7.16666667 87. ] [ 8.66666667 71.33333333] [ 6. 86.83333333] [ 7. 82.8 ] [ 7. 78.6 ]]] [[8.85714286]] [[[ 7.4 92. ] [ 7.16666667 87. ] [ 8.66666667 71.33333333] [ 6. 86.83333333] [ 7. 82.8 ] [ 7. 78.6 ] [ 8.85714286 63.71428571]]] [[14.]] [[[31.22222222 30.44444444] [31. 34.25 ] [32.55555556 38.44444444] [34. 27.33333333] [33.5 24.125 ] [34.5 27.5 ] [34.25 39.375 ]]] [[32.9]] [[[31. 34.25 ] [32.55555556 38.44444444] [34. 27.33333333] [33.5 24.125 ] [34.5 27.5 ] [34.25 39.375 ] [32.9 40.9 ]]] [[32.875]]
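The same windowing can also stay inside the tf.data pipeline: mapping each batch to an (inputs, targets) pair avoids materialising the numpy lists entirely. A minimal sketch, assuming the dataset defined above (ds is just an illustrative name):
# map each (1, 8, 2) batch to a ((1, 7, 2) window, (1, 1) meantemp label) pair
ds = dataset.map(lambda w: (w[:, :-horizon, :], w[:, -horizon:, 0]))
# ds can then be passed directly to model.fit(...)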
# Create a function that predicts the next time step, appends the prediction to the last window, and repeats
def make_future_forecast(values, model, into_future, window_size=WINDOW_SIZE) -> list:
    """
    Makes future forecasts into_future steps after values ends.
    Returns future forecasts as a list of floats.
    """
    # Make an empty list for future forecasts and grab the last window from values
    future_forecast = []
    last_window = values[-window_size:] # only want preds from the last window (this will get updated)
    # Make into_future predictions, altering the data which gets predicted on each time
    for _ in range(into_future):
        # Predict on the last window (the model starts to make forecasts on its own forecasts)
        future_pred = model.predict(tf.expand_dims(last_window, axis=0))
        print(f"Predicting on: \n {last_window} -> Prediction: {tf.squeeze(future_pred).numpy()}\n")
        # Append the prediction to future_forecast
        future_forecast.append(tf.squeeze(future_pred).numpy())
        # Update the last window with the new pred, keeping only the window_size most recent values (the model was trained on windows of this size)
        last_window = np.append(last_window, future_pred)[-window_size:]
    return future_forecast
# Make Predictions
INTO_FUTURE = 50
WINDOW_SIZE = 7
future_forecast = make_future_forecast(values=df_3['meantemp'],
model=model_5,
into_future=INTO_FUTURE,
window_size=WINDOW_SIZE)
1/1 [==============================] - 0s 20ms/step
Predicting on:
1568    34.000
1569    33.500
1570    34.500
1571    34.250
1572    32.900
1573    32.875
1574    32.000
Name: meantemp, dtype: float64 -> Prediction: 32.330177307128906
1/1 [==============================] - 0s 16ms/step
Predicting on: [33.5 34.5 34.25 32.9 32.875 32. 32.33017731] -> Prediction: 32.4648551940918
1/1 [==============================] - 0s 16ms/step
Predicting on: [34.5 34.25 32.9 32.875 32. 32.33017731 32.46485519] -> Prediction: 32.627403259277344
... (46 more steps: each prediction is appended to the window, the oldest value is dropped, and the model is called again) ...
1/1 [==============================] - 0s 18ms/step
Predicting on: [30.18088341 30.13408852 30.08771706 30.04177284 29.99624062 29.95111847 29.9064045 ] -> Prediction: 29.862092971801758
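The log above is produced by an autoregressive loop: the model predicts one step from the last 7-value window, the prediction is appended, the oldest value drops off, and the model is called again, 50 times in total. Below is a minimal sketch of such a loop, assuming model_3 is the trained univariate model and that it accepts a flat (1, 7) input as a dense model would; the names here are illustrative, not necessarily the exact code that produced the log.
# sketch of an autoregressive forecast loop (assumed names: model_3, df_3)
import numpy as np

WINDOW_SIZE = 7
INTO_FUTURE = 50

def make_future_forecast(values, model, steps=INTO_FUTURE, window_size=WINDOW_SIZE):
    """Predict one step ahead, append it to the window, and repeat."""
    forecast = []
    last_window = np.asarray(values[-window_size:], dtype=np.float32)
    for _ in range(steps):
        pred = model.predict(last_window.reshape(1, window_size))
        next_val = float(pred.squeeze())
        print(f"Predicting on: {last_window} -> Prediction: {next_val}")
        forecast.append(next_val)
        # slide the window: drop the oldest value, append the new prediction
        last_window = np.append(last_window[1:], next_val)
    return forecast

# e.g. future_forecast = make_future_forecast(df_3['meantemp'].values, model_3)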
# Create a new dataframe with the date and meantemp columns, plus a label column
# to distinguish the observed series from the forecast appended below
df_last = df_3[['date', 'meantemp']].copy()
df_last['Label'] = 'Historical Data'
df_last
| | date | meantemp | Label |
|---|---|---|---|
| 0 | 2013-01-01 | 10.000000 | Historical Data |
| 1 | 2013-01-02 | 7.400000 | Historical Data |
| 2 | 2013-01-03 | 7.166667 | Historical Data |
| 3 | 2013-01-04 | 8.666667 | Historical Data |
| 4 | 2013-01-05 | 6.000000 | Historical Data |
| ... | ... | ... | ... |
| 1570 | 2017-04-20 | 34.500000 | Historical Data |
| 1571 | 2017-04-21 | 34.250000 | Historical Data |
| 1572 | 2017-04-22 | 32.900000 | Historical Data |
| 1573 | 2017-04-23 | 32.875000 | Historical Data |
| 1574 | 2017-04-24 | 32.000000 | Historical Data |

1575 rows × 3 columns
future_forecast
[32.330177, 32.464855, 32.627403, 32.51569, 32.41227, 32.254322, 32.156895, 32.092724, 32.055916, 32.00875, 31.951113, 31.882454, 31.812412, 31.745281, 31.683237, 31.624018, 31.565596, 31.506609, 31.447151, 31.38778, 31.329113, 31.271362, 31.21444, 31.158134, 31.102306, 31.046913, 30.991993, 30.937592, 30.88373, 30.830404, 30.777592, 30.72528, 30.673447, 30.622091, 30.57122, 30.520824, 30.470896, 30.421434, 30.372435, 30.323883, 30.275778, 30.228113, 30.180883, 30.134089, 30.087717, 30.041773, 29.99624, 29.951118, 29.906404, 29.862093]
label_list = ["Forecast"] * len(future_forecast)
label_list
['Forecast', 'Forecast', 'Forecast', ..., 'Forecast'] (50 entries, one per forecast step)
# create a date range starting from the last historical date
# (note: pd.date_range starts at last_date itself, so the first forecast row
# shares its date, 2017-04-24, with the final observation; see the variant below)
last_date = df_last['date'].iloc[-1]
date_range = pd.date_range(last_date, periods=50, freq='D')
# assemble the forecast rows into a dataframe via a dictionary
data = {'date': date_range, 'meantemp': future_forecast, 'Label': label_list}
df = pd.DataFrame(data)
# append the forecast rows to df_last (DataFrame.append is deprecated and was
# removed in pandas 2.x; pd.concat is the supported replacement)
df_last = pd.concat([df_last, df], ignore_index=True)
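As noted in the comment above, pd.date_range(last_date, ...) makes the first forecast row fall on the same date as the last observation. A minimal variant that starts the forecast on the following day instead (an alternative sketch, not the code that produced the output below):
# alternative: begin the forecast index one day after the last observation
start = pd.to_datetime(last_date) + pd.Timedelta(days=1)
date_range = pd.date_range(start, periods=50, freq='D')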
df_last
| | date | meantemp | Label |
|---|---|---|---|
| 0 | 2013-01-01 | 10.000000 | Historical Data |
| 1 | 2013-01-02 | 7.400000 | Historical Data |
| 2 | 2013-01-03 | 7.166667 | Historical Data |
| 3 | 2013-01-04 | 8.666667 | Historical Data |
| 4 | 2013-01-05 | 6.000000 | Historical Data |
| ... | ... | ... | ... |
| 1620 | 2017-06-08 | 30.041773 | Forecast |
| 1621 | 2017-06-09 | 29.996241 | Forecast |
| 1622 | 2017-06-10 | 29.951118 | Forecast |
| 1623 | 2017-06-11 | 29.906404 | Forecast |
| 1624 | 2017-06-12 | 29.862093 | Forecast |

1625 rows × 3 columns
# create a line plot, drawing each label group ('Historical Data' vs. 'Forecast')
# as its own line so matplotlib gives each a separate color and legend entry
import matplotlib.pyplot as plt

df_last['date'] = pd.to_datetime(df_last['date'])  # ensure datetime x-axis (no-op if already converted)
fig, ax = plt.subplots(figsize=(10, 6))
for label, group in df_last.groupby('Label'):
    group.plot(x='date', y='meantemp', ax=ax, label=label)
# set the title and axis labels
ax.set_title('Model_3 Mean Temperature Forecast')
ax.set_xlabel('Date')
ax.set_ylabel('Mean Temperature')
# show the legend
ax.legend()
plt.show()
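To see the hand-off between the observed series and the forecast more clearly, the same grouped plot can be drawn over just the tail of the combined frame (a minimal sketch; the 120-row cutoff is arbitrary):
# zoom in on the last ~70 historical days plus the 50 forecast days
recent = df_last.tail(120)
fig, ax = plt.subplots(figsize=(10, 6))
for label, group in recent.groupby('Label'):
    group.plot(x='date', y='meantemp', ax=ax, label=label)
ax.set_title('Model_3 Mean Temperature Forecast (zoomed)')
ax.set_xlabel('Date')
ax.set_ylabel('Mean Temperature')
ax.legend()
plt.show()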